Relevance is one of the key things we can use to decompose or semi-decompose problems.
We then started on a third technique, which is called inductive logic programming, ILP, where we basically use top-down approaches to learn not only new generalized formulae, but sometimes even to extend our language.
That addresses one of the big problems: where do new concepts come from?
So far we have only seen settings where we learned new things that we were able to express in the old language.
That's nice, but we need more; it doesn't explain where the concepts come from.
Here's an example.
The upshot will be the following. Here in our example we have a couple of relations: we have the marriage relation, we know about individuals whether they're male or female, and we have the mother and father relations.
The thing you want to keep an eye on in this example is the generalization of the mother and father relations, namely the parent relation.
What I'm trying to show you is that it is both useful and indeed possible to synthesize the concept of parent.
In our data, we do not have the parent relation.
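As a sketch (this formula is not part of the data), here is the definition of parent that we would like the learner to eventually come up with on its own:

```latex
\forall x, y.\;\; \mathrm{parent}(x, y) \;\Leftrightarrow\; \mathrm{mother}(x, y) \lor \mathrm{father}(x, y)
```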
This is just a visualization, but if you actually look at the data, then you'll see facts like: Philip is the father of Charles, the Queen Mum is the mother of Margaret, and Diana and Charles are married.
Somewhat redundantly, we also know that Philip is male, which goes very well with the father relation, and so on.
It's kind of a rigged example for this purpose.
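As a minimal sketch of what such background facts look like, here they are as plain Python data; the individuals and the exact set of facts are only illustrative (a reduced version of the family tree used in the lecture):

```python
# Background facts of the (reduced, illustrative) family-tree example.
# "Mum" stands for the Queen Mum.

male = {"George", "Philip", "Charles", "Andrew", "Edward", "Mark"}
female = {"Mum", "Elizabeth", "Margaret", "Anne", "Diana", "Sarah"}

married = {("George", "Mum"), ("Philip", "Elizabeth"),
           ("Charles", "Diana"), ("Mark", "Anne")}

# father[child] = father of that child, mother[child] = mother of that child
father = {"Elizabeth": "George", "Margaret": "George",
          "Charles": "Philip", "Anne": "Philip",
          "Andrew": "Philip", "Edward": "Philip"}
mother = {"Elizabeth": "Mum", "Margaret": "Mum",
          "Charles": "Elizabeth", "Anne": "Elizabeth",
          "Andrew": "Elizabeth", "Edward": "Elizabeth"}
```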
We want to learn the grandparent relation, and we do that by giving examples.
If you go back to the example, then you can just read off that George is a grandparent of Anne, and also of Andrew, Edward, and Charles.
From this little family tree we can actually derive quite a few positive examples, 12 positive examples, some of which I've shown you here.
We also have negative examples: any pair of individuals that is not in the grandparent relation gives us a negative example.
That's the setting.
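Continuing the Python sketch from above (it assumes the male, female, father, and mother facts defined there), the positive and negative examples can be read off the data like this; note that the reduced fact set yields fewer than the 12 positive examples of the full family tree:

```python
# Positive examples: pairs actually in the grandparent relation.
# Negative examples: every other pair of individuals (closed-world assumption).

parent = ({(p, c) for c, p in father.items()}
          | {(p, c) for c, p in mother.items()})
grandparent = {(g, c) for (g, p1) in parent for (p2, c) in parent if p1 == p2}

people = male | female
positives = grandparent                                   # e.g. ("George", "Anne")
negatives = {(x, y) for x in people for y in people} - positives
```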
We have a couple of background facts, and we have a couple of positive and negative examples for a relation.
We want to infer a definition of that relation in terms of those background relations.
And of course you can: X is a grandparent of Y if there is a Z such that X is the mother of Z and Z is the mother of Y.
And you can now see why I'm making the example with mother and father: it actually makes the whole thing complicated, so complicated that I don't actually want to write it all down.
You end up with four clauses, each with an existential quantifier.
That is indeed something you want to try and learn.
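For reference, here is a sketch of that definition written out: four existentially quantified clauses when only mother and father are available, and just two clauses once the parent relation has been synthesized:

```latex
\begin{aligned}
\mathrm{grandparent}(x,y) &\Leftarrow \exists z.\; \mathrm{mother}(x,z) \land \mathrm{mother}(z,y)\\
\mathrm{grandparent}(x,y) &\Leftarrow \exists z.\; \mathrm{mother}(x,z) \land \mathrm{father}(z,y)\\
\mathrm{grandparent}(x,y) &\Leftarrow \exists z.\; \mathrm{father}(x,z) \land \mathrm{mother}(z,y)\\
\mathrm{grandparent}(x,y) &\Leftarrow \exists z.\; \mathrm{father}(x,z) \land \mathrm{father}(z,y)
\end{aligned}
```

With a synthesized parent relation:

```latex
\begin{aligned}
\mathrm{parent}(x,y) &\Leftarrow \mathrm{mother}(x,y) \lor \mathrm{father}(x,y)\\
\mathrm{grandparent}(x,y) &\Leftarrow \exists z.\; \mathrm{parent}(x,z) \land \mathrm{parent}(z,y)
\end{aligned}
```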
And the question is: how are we going to do that?
I've convinced you, or at least tried to convince you, that decision tree learning is not even applicable, because for relation learning the attribute-based paradigm, where we only have single-variable predicates, is just not suited.
All the stuff you would like to write down, you simply cannot write down.
Recap: Inductive Logic Programming: An Example
The main video on this topic is chapter 10, clip 5.